10-708: Probabilistic Graphical Models, Spring 2014, Lecture 13: Variational Inference: Loopy Belief Propagation
Abstract
The problem of probabilistic inference concerns answering queries about conditional and marginal probabilities in graphical models. Consider two disjoint subsets $E$ and $F$ of the nodes in a graphical model $G$. A query for the marginal distribution $p(x_F)$ can be answered by the marginalization operation $p(x_F) = \sum_{x_{G \setminus F}} p(x)$. A query for the conditional distribution $p(x_F \mid x_E)$ can be answered by $p(x_F \mid x_E) = p(x_F, x_E) / p(x_E)$. A query could also ask for a mode of the density, $\hat{x} = \arg\max_{x \in \mathcal{X}^m} p(x)$. In the previous lectures, we have learned many exact inference techniques, such as naive brute-force marginalization, variable elimination, and the family of message passing algorithms including sum-product, belief propagation, and the junction tree algorithm. In the brute-force and variable elimination techniques, individual queries are computed independently, so several intermediate terms may be computed repeatedly; the message passing algorithms, by contrast, allow intermediate terms to be shared and are hence more effective in the long run.
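To make these query types concrete, here is a minimal sketch (not from the lecture notes; the chain structure, potentials, and variable names are invented for illustration). It answers a marginal, a conditional, and a mode query by brute-force enumeration over the joint, then recomputes one marginal with shared sum-product messages to show the reuse of intermediate terms.

```python
import itertools
import numpy as np

# Toy chain A - B - C with binary variables. The pairwise potentials are
# made-up numbers chosen only to illustrate the queries; any positive
# tables would do.
psi_ab = np.array([[3.0, 1.0], [1.0, 2.0]])  # psi(a, b)
psi_bc = np.array([[2.0, 1.0], [1.0, 4.0]])  # psi(b, c)

# Brute force: materialize the full joint p(a, b, c) ~ psi(a,b) * psi(b,c),
# which costs O(2^m) for m binary variables.
joint = np.zeros((2, 2, 2))
for a, b, c in itertools.product(range(2), repeat=3):
    joint[a, b, c] = psi_ab[a, b] * psi_bc[b, c]
joint /= joint.sum()  # normalize by the partition function Z

# Marginal query p(a): sum out the variables outside the query set.
p_a = joint.sum(axis=(1, 2))

# Conditional query p(a | c=1) = p(a, c=1) / p(c=1).
p_ac1 = joint[:, :, 1].sum(axis=1)   # p(a, c=1)
p_a_given_c1 = p_ac1 / p_ac1.sum()   # divide by p(c=1)

# Mode query: argmax_x p(x) over the joint.
mode = np.unravel_index(joint.argmax(), joint.shape)

# Sum-product reuses messages instead of re-enumerating the joint:
# p(b) is proportional to (sum_a psi(a,b)) * (sum_c psi(b,c)).
m_a_to_b = psi_ab.sum(axis=0)  # message from A into B
m_c_to_b = psi_bc.sum(axis=1)  # message from C into B
p_b = m_a_to_b * m_c_to_b
p_b /= p_b.sum()               # matches joint.sum(axis=(0, 2))

print("p(a) =", p_a)
print("p(a | c=1) =", p_a_given_c1)
print("mode (a, b, c) =", mode)
print("p(b) via messages =", p_b)
```

On a tree like this chain the message-passing answer is exact; the point of the brute-force version is only that its cost grows exponentially with the number of variables, while messages are computed once per edge and reused across queries.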
Similar resources
Locally Conditioned Belief Propagation
Conditioned Belief Propagation (CBP) is an algorithm for approximate inference in probabilistic graphical models. It works by conditioning on a subset of variables and solving the remainder using loopy Belief Propagation. Unfortunately, CBP’s runtime scales exponentially in the number of conditioned variables. Locally Conditioned Belief Propagation (LCBP) approximates the results of CBP by trea...
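As a rough illustration of the scheme this abstract describes, the sketch below runs conditioned inference on a toy 4-cycle: it enumerates every assignment of a conditioned set (the source of the exponential runtime), runs plain loopy BP on each clamped model, and mixes the resulting beliefs. The graph, the potentials, and the choice to weight each run by its exact clamped partition function (computable here only because the model is tiny; the actual CBP/LCBP algorithms estimate these weights) are all assumptions made for the example, not the papers' implementation.

```python
import itertools
import numpy as np

# Toy pairwise MRF on a 4-cycle (binary variables); potentials are invented
# and chosen only so the graph genuinely contains a loop.
n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
psi = {e: np.array([[2.0, 1.0], [1.0, 2.0]]) for e in edges}
nbrs = {i: [] for i in range(n)}
for a, b in edges:
    nbrs[a].append(b)
    nbrs[b].append(a)

def edge_pot(i, j):
    # Potential table oriented as [x_i, x_j].
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

def loopy_bp(unary, iters=50):
    # Plain sum-product loopy BP; one message per directed edge.
    msgs = {(i, j): np.ones(2) / 2 for a, b in edges for i, j in [(a, b), (b, a)]}
    for _ in range(iters):
        new = {}
        for i, j in msgs:
            inc = unary[i].copy()
            for k in nbrs[i]:
                if k != j:
                    inc = inc * msgs[(k, i)]
            m = inc @ edge_pot(i, j)
            new[(i, j)] = m / m.sum()
        msgs = new
    beliefs = np.empty((n, 2))
    for i in range(n):
        b = unary[i].copy()
        for k in nbrs[i]:
            b = b * msgs[(k, i)]
        beliefs[i] = b / b.sum()
    return beliefs

def exact_Z(unary):
    # Exact clamped partition function by enumeration; feasible only
    # because the toy model has 2^4 states. Real CBP estimates this.
    z = 0.0
    for x in itertools.product(range(2), repeat=n):
        w = np.prod([unary[i][x[i]] for i in range(n)])
        for a, b in edges:
            w *= psi[(a, b)][x[a], x[b]]
        z += w
    return z

cond_set = [0]  # conditioned variables; runtime grows as 2^len(cond_set)
mix, total = np.zeros((n, 2)), 0.0
for assignment in itertools.product(range(2), repeat=len(cond_set)):
    unary = np.ones((n, 2))
    for var, val in zip(cond_set, assignment):
        unary[var] = 0.0
        unary[var, val] = 1.0  # clamp the conditioned variable
    w = exact_Z(unary)
    mix += w * loopy_bp(unary)
    total += w
print("mixed marginals:\n", mix / total)
```

The outer loop is the exponential factor the abstract mentions: with $k$ conditioned binary variables there are $2^k$ clamped BP runs, which is what LCBP is designed to avoid.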
Graphical models and symmetries: loopy belief propagation approaches
Whenever a person or an automated system has to reason in uncertain domains, probability theory is necessary. Probabilistic graphical models allow us to build statistical models that capture complex dependencies between random variables. Inference in these models, however, can easily become intractable. Typical ways to address this scaling issue are inference by approximate message-passing, sto...
Loopy Belief Propagation: Bayesian Networks for Multi-Criteria Decision Making (MCDM)
Loopy belief propagation is an increasingly popular method for performing approximate inference on arbitrary graphical models. A Bayesian network is a graphical model that encodes probabilistic relationships among variables of interest. When used in conjunction with statistical techniques, the graphical model has several advantages for data mining. Influence diagrams provide a compact technique to...
Learning Cost-Aware, Loss-Aware Approximate Inference Policies for Probabilistic Graphical Models
Probabilistic graphical models are typically trained to maximize the likelihood of the training data and evaluated on some measure of accuracy on the test data. However, we are also interested in learning to produce predictions quickly. For example, one can speed up loopy belief propagation by choosing sparser models and by stopping at some point before convergence. We manage the speed-accuracy...
Variational loopy belief propagation for multi-talker speech recognition
We address single-channel speech separation and recognition by combining loopy belief propagation and variational inference methods. Inference is done in a graphical model consisting of an HMM for each speaker combined with the max interaction model of source combination. We present a new variational inference algorithm that exploits the structure of the max model to compute an arbitrarily tigh...